290 research outputs found

    Effects of R-Alpha Lipoic Acid on HbA1c, Lipids and Blood Pressure in Type-2 Diabetics: A Preliminary Study

    R-alpha lipoic acid (R-ALA) supplementation improves blood glucose in diabetic animals, but there have been no long-term clinical trials in humans testing its use for glucose control (HbA1c). This double-blind study, with a pre-/post-test control-group (PL) design, sought to determine the effect of R-ALA on HbA1c. Twenty type-2 diabetics were randomly assigned to 200 mg capsules of R-ALA (n=13; 8M 5F) or PL (n=7; 2M 5F) 3 times daily, 30 minutes before meals (600 mg total) for 91 days. Samples were obtained for HbA1c at baseline and day 91. No significant differences between the R-ALA and PL groups were found at baseline or day 91. However, three distinct responses to the supplement were noted. The first group (n=3) responded to R-ALA with a >25% drop in HbA1c (range narrowing from 6.1–12.5% to 6.2–9.0%) and/or halved their anti-diabetic medication. The second group (n=5) had no change in HbA1c. The third group (n=5) had changes in medication or concurrent chronic adverse events that should have raised HbA1c but did not, beyond that of the placebo. Conclusions: Three months of R-ALA supplementation may lower HbA1c in a small number of individuals. However, larger studies of longer duration are needed to confirm these findings.
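The responder classification in the abstract turns on a simple percent-change computation. As an illustrative sketch only (not the study's analysis code; the function names, thresholds other than the stated >25% drop, and data are hypothetical):

```python
# Hypothetical sketch of the responder classification described in the
# abstract: a >25% drop in HbA1c and/or halved medication marks group 1.

def percent_change(baseline: float, day91: float) -> float:
    """Relative change from baseline to day 91, as a percentage."""
    return (day91 - baseline) / baseline * 100.0

def classify(baseline: float, day91: float, med_halved: bool = False) -> str:
    """Assign one of the three response patterns noted in the abstract."""
    drop = -percent_change(baseline, day91)   # positive number = improvement
    if drop > 25.0 or med_halved:
        return "responder"      # group 1: >25% drop and/or halved medication
    if abs(drop) < 1.0:
        return "no change"      # group 2: HbA1c essentially unchanged
    return "other"              # group 3: confounded by medication changes

print(classify(12.5, 9.0))  # a 28% drop, so "responder"
```

A drop from 12.5% to 9.0% HbA1c is a 28% relative reduction, which clears the abstract's >25% threshold.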

    Proof-theoretic semantics, a problem with negation and prospects for modality

    This paper discusses proof-theoretic semantics, the project of specifying the meanings of the logical constants in terms of rules of inference governing them. I concentrate on Michael Dummett's and Dag Prawitz's philosophical motivations and give precise characterisations of the crucial notions of harmony and stability, placed in the context of proving normalisation results in systems of natural deduction. I point out a problem for defining the meaning of negation in this framework and discuss prospects for an account of the meanings of modal operators in terms of rules of inference.
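As a standard textbook illustration of the notions at stake (not drawn from the paper itself), the introduction and elimination rules for conjunction are harmonious: the E-rules extract no more than the I-rule requires.

```latex
% Harmonious rules for conjunction: whatever the E-rules infer from
% A /\ B was already available as a ground for asserting it.
\[
\frac{A \qquad B}{A \land B}\,{\land}\mathrm{I}
\qquad
\frac{A \land B}{A}\,{\land}\mathrm{E}_1
\qquad
\frac{A \land B}{B}\,{\land}\mathrm{E}_2
\]
```

An application of the I-rule immediately followed by an E-rule can be normalised away, and such local reduction steps are the engine of the normalisation results the abstract mentions.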

    General-elimination stability

    General-elimination harmony articulates Gentzen's idea that the elimination-rules are justified if they infer from an assertion no more than can already be inferred from the grounds for making it. Dummett described the rules as not only harmonious but stable if the E-rules allow one to infer no more and no less than the I-rules justify. Pfenning and Davies call the rules locally complete if the E-rules are strong enough to allow one to infer the original judgement. A method is given of generating harmonious general-elimination rules from a collection of I-rules. We show that the general-elimination rules satisfy Pfenning and Davies' test for local completeness, but question whether that is enough to show that they are stable. Alternative conditions for stability are considered, including equivalence between the introduction- and elimination-meanings of a connective, and recovery of the grounds for assertion, finally generalizing the notion of local completeness to capture Dummett's notion of stability satisfactorily. We show that the general-elimination rules meet the last of these conditions, and so are indeed not only harmonious but also stable.
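For concreteness, here is the standard general-elimination rule for conjunction together with the local-completeness check; this is a textbook example, not a rule scheme quoted from the paper.

```latex
% General-elimination rule for /\: from A /\ B and a derivation of C
% from the grounds A, B, infer C (amsmath's matrix environment is
% used for the hypothetical derivation).
\[
\frac{A \land B \qquad \begin{matrix}[A]\;[B]\\ \vdots\\ C\end{matrix}}{C}\,{\land}\mathrm{GE}
\]
% Local completeness (Pfenning-Davies): take C := A /\ B and close the
% hypothetical derivation with the I-rule, recovering the original
% judgement A /\ B from its own elimination.
```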

    The open future, bivalence and assertion

    It is highly intuitive that the future is open and the past is closed—whereas it is unsettled whether there will be a fourth world war, it is settled that there was a first. Recently, it has become increasingly popular to claim that the intuitive openness of the future implies that contingent statements about the future, such as ‘there will be a sea battle tomorrow,’ are non-bivalent (neither true nor false). In this paper, we argue that the non-bivalence of future contingents is at odds with our pre-theoretic intuitions about the openness of the future. These are revealed by our pragmatic judgments concerning the correctness and incorrectness of assertions of future contingents. We argue that the pragmatic data, together with a plausible account of assertion, show that in many cases we take future contingents to be true (or to be false), though we take the future to be open in relevant respects. It follows that appeals to intuition to support the non-bivalence of future contingents are untenable. Intuition favours bivalence.

    A Bell Inequality Analog in Quantum Measure Theory

    One obtains Bell's inequalities if one posits a hypothetical joint probability distribution, or measure, whose marginals yield the probabilities produced by the spin measurements in question. The existence of a joint measure is in turn equivalent to a certain causality condition known as "screening off". We show that if one assumes, more generally, a joint quantal measure, or "decoherence functional", one obtains instead an analogous inequality weaker by a factor of 2√2. The proof of this "Tsirel'son inequality" is geometrical and rests on the possibility of associating a Hilbert space to any strongly positive quantal measure. These results lead both to a question: "Does a joint measure follow from some quantal analog of 'screening off'?", and to the observation that non-contextual hidden variables are viable in histories-based quantum mechanics, even if they are excluded classically.
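The bounds being compared can be checked numerically. A minimal sketch of the standard CHSH setup (not the paper's quantal-measure derivation): for a singlet state the spin correlator is E(a, b) = -cos(a - b), classical "screened-off" models obey |S| ≤ 2, and quantum mechanics reaches Tsirel'son's bound 2√2 at suitably chosen analyser angles.

```python
# CHSH correlator sum at the angles that saturate Tsirel'son's bound.
import math

def E(a: float, b: float) -> float:
    """Singlet-state spin correlation for analyser angles a and b."""
    return -math.cos(a - b)

a, a2 = 0.0, math.pi / 2               # Alice's two settings
b, b2 = math.pi / 4, 3 * math.pi / 4   # Bob's two settings

# CHSH combination; classical bound is 2, quantum bound is 2*sqrt(2).
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2.828..., i.e. 2*sqrt(2), exceeding the classical bound 2
```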

    Implicit complexity for coinductive data: a characterization of corecurrence

    We propose a framework for reasoning about programs that manipulate coinductive data as well as inductive data. Our approach is based on using equational programs, which support a seamless combination of computation and reasoning, and using productivity (fairness) as the fundamental assertion, rather than bisimulation. The latter is expressible in terms of the former. As an application of this framework, we give an implicit characterization of corecurrence: a function is definable using corecurrence iff its productivity is provable using coinduction for formulas in which data-predicates do not occur negatively. This is an analog, albeit in weaker form, of a characterization of recurrence (i.e. primitive recursion) in [Leivant, Unipolar induction, TCS 318, 2004]. (In Proceedings DICE 2011, arXiv:1201.034)
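A minimal illustration of the two key notions, rendered in Python generators rather than the paper's equational framework: a coinductive stream defined by a recursive equation, which is productive because every unfolding step emits the next element.

```python
# Corecursively defined infinite stream: the Fibonacci numbers, given by
# the equation fib = 0 : 1 : zipWith (+) fib (tail fib). Productivity
# means each step of the loop is guaranteed to yield an element, so any
# finite prefix of the infinite stream can be observed.
from itertools import islice
from typing import Iterator

def fib() -> Iterator[int]:
    a, b = 0, 1
    while True:          # productive: every iteration yields exactly once
        yield a
        a, b = b, a + b

print(list(islice(fib(), 8)))  # [0, 1, 1, 2, 3, 5, 8, 13]
```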

    Models of HoTT and the Constructive View of Theories

    Homotopy Type Theory and its model theory provide a novel formal semantic framework for representing scientific theories. This framework supports a constructive view of theories, according to which a theory is essentially characterised by its methods. The constructive view of theories was defended earlier by Ernest Nagel and a number of other philosophers, but the logical means then available did not allow them to build formal representational frameworks implementing this view.

    Radical anti-realism and substructural logics

    We first provide the outline of an argument in favour of a radical form of anti-realism premised on the need to comply with two principles, implicitness and immanence, when framing assertability conditions. It follows from the first principle that one ought to avoid explicitly bounding the length of computations, as some strict finitists do, and look for structural weakening instead. In order to comply with the principle of immanence, one ought to take into account the difference between being able to recognize a proof when presented with one and being able to produce one, and thus avoid the idealization of our cognitive capacities that arises within Hilbert-style calculi. We then explore the possibility of weakening structural rules in order to comply with radical anti-realist strictures.
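For readers unfamiliar with the terminology, the structural rules at issue are those of a sequent calculus; the following display is a standard presentation, not a system taken from the paper. Restricting rules such as contraction constrains the growth of derivations structurally, without imposing any explicit numerical bound on computation length.

```latex
% Contraction and weakening in a standard sequent calculus. Substructural
% logics arise by dropping or restricting such rules.
\[
\frac{\Gamma, A, A \vdash C}{\Gamma, A \vdash C}\,(\mathit{Contraction})
\qquad
\frac{\Gamma \vdash C}{\Gamma, A \vdash C}\,(\mathit{Weakening})
\]
```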